The Support Vector Machine (SVM) has been used in a wide variety of classification problems. The original SVM uses the hinge loss function, which is non-differentiable and makes the problem difficult to solve, in particular for regularized SVMs such as those with $\ell_1$-regularization. This paper considers the Huberized SVM (HSVM), which uses a differentiable approximation of the hinge loss function. We first explore the use of the Proximal Gradient (PG) method for solving the binary-class HSVM (B-HSVM) and then generalize it to the multi-class HSVM (M-HSVM). Under strong convexity assumptions, we show that our algorithm converges linearly. In addition, we give a finite convergence result on the support of the solution, based on which we further accelerate the algorithm by a two-stage method. We present extensive numerical experiments on both synthetic and real datasets, which demonstrate the superiority of our methods over some state-of-the-art methods for both binary- and multi-class SVMs.
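For concreteness, the following is a minimal sketch of the two ingredients named above: a huberized (smoothed) hinge loss, and the soft-thresholding operator that serves as the proximal map of the $\ell_1$-regularizer in a PG step. The smoothing parameter `delta` and the exact piecewise form are assumptions based on standard huberized-hinge definitions, not details taken from this abstract.

```python
import numpy as np

def huberized_hinge(t, delta=0.5):
    """Smoothed hinge loss of the margin t = y * f(x).

    Piecewise: zero for t > 1, quadratic on (1 - delta, 1],
    linear for t <= 1 - delta. Unlike the plain hinge
    max(0, 1 - t), this function is continuously differentiable,
    which is what makes gradient-based methods applicable.
    """
    t = np.asarray(t, dtype=float)
    return np.where(
        t > 1.0,
        0.0,
        np.where(
            t > 1.0 - delta,
            (1.0 - t) ** 2 / (2.0 * delta),   # quadratic transition zone
            1.0 - t - delta / 2.0,            # linear tail, matches slope -1
        ),
    )

def soft_threshold(w, lam):
    """Proximal operator of lam * ||w||_1: shrink each coordinate
    toward zero by lam, setting small coordinates exactly to zero.
    This is the step that produces sparse solutions under
    l1-regularization in a proximal gradient iteration."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)
```

A PG iteration for the $\ell_1$-regularized B-HSVM would alternate a gradient step on the smooth huberized-hinge term with a `soft_threshold` step on the regularizer; the finite-support result mentioned above concerns which coordinates the soft-thresholding eventually fixes at zero.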